Results 1 - 20 of 80
1.
Anal Bioanal Chem ; 416(5): 1249-1267, 2024 Feb.
Article in English | MEDLINE | ID: mdl-38289355

ABSTRACT

Non-targeted analysis (NTA) is an increasingly popular technique for characterizing undefined chemical analytes. Generating quantitative NTA (qNTA) concentration estimates requires the use of training data from calibration "surrogates," which can yield diminished predictive performance relative to targeted analysis. To evaluate performance differences between targeted and qNTA approaches, we defined new metrics that convey predictive accuracy, uncertainty (using 95% inverse confidence intervals), and reliability (the extent to which confidence intervals contain true values). We calculated and examined these newly defined metrics across five quantitative approaches applied to a mixture of 29 per- and polyfluoroalkyl substances (PFAS). The quantitative approaches spanned a traditional targeted design using chemical-specific calibration curves to a generalizable qNTA design using bootstrap-sampled calibration values from "global" chemical surrogates. As expected, the targeted approaches performed best, with major benefits realized from matched calibration curves and internal standard correction. In comparison to the benchmark targeted approach, the most generalizable qNTA approach (using "global" surrogates) showed a decrease in accuracy by a factor of ~4, an increase in uncertainty by a factor of ~1000, and a decrease in reliability by ~5%, on average. Using "expert-selected" surrogates (n = 3) instead of "global" surrogates (n = 25) for qNTA yielded improvements in predictive accuracy (by ~1.5×) and uncertainty (by ~70×) but at the cost of further-reduced reliability (by ~5%). Overall, our results illustrate the utility of qNTA approaches for a subclass of emerging contaminants and present a framework on which to develop new approaches for more complex use cases.
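The accuracy (fold-error) and reliability (confidence-interval coverage) metrics described in this abstract can be sketched in a few lines. The function names and example numbers below are illustrative placeholders, not the authors' implementation:

```python
import statistics

def fold_error(predicted, true):
    """Symmetric fold difference between predicted and true concentrations."""
    return max(predicted / true, true / predicted)

def ci_coverage(intervals, true_values):
    """Reliability: fraction of confidence intervals containing the true value."""
    hits = sum(1 for (lo, hi), t in zip(intervals, true_values) if lo <= t <= hi)
    return hits / len(true_values)

# hypothetical predictions for three analytes (units arbitrary)
predicted = [2.0, 8.0, 55.0]
true_conc = [1.0, 10.0, 50.0]
intervals = [(0.5, 4.0), (2.0, 6.0), (20.0, 120.0)]

mean_fold = statistics.mean(fold_error(p, t) for p, t in zip(predicted, true_conc))
reliability = ci_coverage(intervals, true_conc)
```

In this toy example the mean fold-error is 1.45 and two of the three intervals cover the true value, so reliability is about 67%.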

2.
Environ Int ; 178: 108097, 2023 08.
Article in English | MEDLINE | ID: mdl-37478680

ABSTRACT

Exposure science is evolving from its traditional "after the fact" and "one chemical at a time" approach to forecasting chemical exposures rapidly enough to keep pace with the constantly expanding landscape of chemicals and exposures. In this article, we provide an overview of the approaches, accomplishments, and plans for advancing computational exposure science within the U.S. Environmental Protection Agency's Office of Research and Development (EPA/ORD). First, to characterize the universe of chemicals in commerce and the environment, a carefully curated, web-accessible chemical resource has been created. This DSSTox database unambiguously identifies >1.2 million unique substances reflecting potential environmental and human exposures and includes computationally accessible links to each compound's corresponding data resources. Next, EPA is developing, applying, and evaluating predictive exposure models. These models increasingly rely on data, computational tools like quantitative structure-activity relationship (QSAR) models, and machine learning/artificial intelligence to provide timely and efficient prediction of chemical exposure (and associated uncertainty) for thousands of chemicals at a time. Integral to this modeling effort, EPA is developing data resources across the exposure continuum, including the application of high-resolution mass spectrometry (HRMS) non-targeted analysis (NTA) methods that provide measurement capability at a scale commensurate with the number of chemicals in commerce. These research efforts are integrated and tailored to support population exposure assessments that prioritize chemicals, a critical input to risk management. In addition, the exposure forecasts will allow a wide variety of stakeholders to explore sustainable initiatives like green chemistry to achieve economic, social, and environmental prosperity and protection of future generations.


Subject(s)
Environmental Pollutants , United States , Humans , Environmental Pollutants/analysis , United States Environmental Protection Agency , Artificial Intelligence , Risk Management , Uncertainty , Environmental Exposure/analysis , Risk Assessment
4.
Front Toxicol ; 5: 1051483, 2023.
Article in English | MEDLINE | ID: mdl-36742129

ABSTRACT

Understanding the metabolic fate of a xenobiotic substance can help inform its potential health risks and allow for the identification of signature metabolites associated with exposure. The need to characterize metabolites of poorly studied or novel substances has shifted exposure studies towards non-targeted analysis (NTA), which often aims to profile many compounds within a sample using high-resolution liquid chromatography-mass spectrometry (LCMS). Here we evaluate the suitability of suspect screening analysis (SSA) with liquid chromatography-mass spectrometry to inform xenobiotic chemical metabolism. Given a lack of knowledge of true metabolites for most chemicals, predictive tools were used to generate potential metabolites as suspect screening lists to guide the identification of selected xenobiotic substances and their associated metabolites. Thirty-three substances were selected to represent a diverse array of pharmaceutical, agrochemical, and industrial chemicals from the Environmental Protection Agency's ToxCast chemical library. The compounds were incubated in a metabolically active in vitro assay using primary hepatocytes, and the resulting supernatant and lysate fractions were analyzed with high-resolution LCMS. Metabolites were simulated for each compound structure using software and then combined to serve as the suspect screening list. The exact masses of the predicted metabolites were then used to select LCMS features for fragmentation via tandem mass spectrometry (MS/MS). Of the starting chemicals, 12 were measured in at least one sample in either positive or negative ion mode, and a subset of these were used to develop the analysis workflow. We implemented a screening-level workflow for background subtraction and the incorporation of time-varying kinetics into the identification of likely metabolites.
We used haloperidol as a case study to perform an in-depth analysis, which resulted in identifying five known metabolites and five molecular features that represent potential novel metabolites, two of which were assigned discrete structures based on in silico predictions. This workflow was applied to five additional test chemicals, and 15 molecular features were selected as either reported metabolites, predicted metabolites, or potential metabolites without a structural assignment. This study demonstrates that in some, but not all, cases, suspect screening analysis methods provide a means to rapidly identify and characterize metabolites of xenobiotic chemicals.
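The suspect-screening step described above, matching exact masses of predicted metabolites against observed LCMS features, amounts to a ppm-tolerance mass comparison. A minimal sketch with hypothetical suspect masses and feature m/z values; the [M+H]+ adduct handling and 5 ppm tolerance are assumptions for illustration, not the study's parameters:

```python
def ppm_error(observed, theoretical):
    """Mass accuracy of a candidate match, in parts per million."""
    return (observed - theoretical) / theoretical * 1e6

def match_suspects(feature_mzs, suspect_masses, tol_ppm=5.0, proton=1.007276):
    """Match observed [M+H]+ feature m/z values against predicted neutral
    monoisotopic masses within a ppm tolerance."""
    hits = []
    for mz in feature_mzs:
        neutral = mz - proton  # strip the proton to compare neutral masses
        for name, mass in suspect_masses.items():
            if abs(ppm_error(neutral, mass)) <= tol_ppm:
                hits.append((mz, name))
    return hits

# hypothetical predicted-metabolite masses and observed feature m/z values
suspects = {"M1": 375.1401, "M2": 198.0500}
features = [376.1472, 210.1000]
hits = match_suspects(features, suspects)
```

Here only the first feature matches a predicted metabolite (within about 0.5 ppm of "M1" after removing the proton mass), so it would be flagged for MS/MS fragmentation.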

5.
Environ Sci Technol ; 57(8): 3075-3084, 2023 Feb 28.
Article in English | MEDLINE | ID: mdl-36796018

ABSTRACT

Several thousand intentional and unintentional chemical releases occur annually in the U.S., with the contents of almost 30% being of unknown composition. When targeted methods are unable to identify the chemicals present, alternative approaches, including non-targeted analysis (NTA) methods, can be used to identify unknown analytes. With new and efficient data processing workflows, it is becoming possible to achieve confident chemical identifications via NTA in a timescale useful for rapid response (typically 24-72 h after sample receipt). To demonstrate the potential usefulness of NTA in rapid response situations, we have designed three mock scenarios that mimic real-world events, including a chemical warfare agent attack, the contamination of a home with illicit drugs, and an accidental industrial spill. Using a novel, focused NTA method that utilizes both existing and new data processing/analysis methods, we have identified the most important chemicals of interest in each of these designed mock scenarios in a rapid manner, correctly assigning structures to more than half of the 17 total features investigated. We have also identified four metrics (speed, confidence, hazard information, and transferability) that successful rapid response analytical methods should address and have discussed our performance for each metric. The results reveal the usefulness of NTA in rapid response scenarios, especially when unknown stressors need timely and confident identification.

7.
Anal Bioanal Chem ; 415(1): 35-44, 2023 Jan.
Article in English | MEDLINE | ID: mdl-36435841

ABSTRACT

Non-targeted analysis (NTA) using high-resolution mass spectrometry allows scientists to detect and identify a broad range of compounds in diverse matrices for monitoring exposure and toxicological evaluation without a priori chemical knowledge. NTA methods present an opportunity to describe the constituents of a sample across a multidimensional swath of chemical properties, referred to as "chemical space." Understanding and communicating which region of chemical space is extractable and detectable by an NTA workflow, however, remains challenging and non-standardized. For example, many sample processing and data analysis steps influence the types of chemicals that can be detected and identified. Accordingly, it is challenging to assess whether analyte non-detection in an NTA study indicates true absence in a sample (above a detection limit) or is a false negative driven by workflow limitations. Here, we describe the need for accessible approaches that enable chemical space mapping in NTA studies, propose a tool to address this need, and highlight the different ways in which it could be implemented in NTA workflows. We identify a suite of existing predictive and analytical tools that can be used in combination to generate scores that describe the likelihood a compound will be detected and identified by a given NTA workflow based on the predicted chemical space of that workflow. Higher scores correspond to a higher likelihood of compound detection and identification in a given workflow (based on sample extraction, data acquisition, and data analysis parameters). Lower scores indicate a lower probability of detection, even if the compound is truly present in the samples of interest. Understanding the constraints of NTA workflows can be useful for stakeholders when results from NTA studies are used in real-world applications and for NTA researchers working to improve their workflow performance. 
The hypothetical ChemSpaceTool suggested herein could be used in both a prospective and retrospective sense. Prospectively, the tool can be used to further curate screening libraries and set identification thresholds. Retrospectively, false detections can be filtered by the plausibility of the compound identification by the selected NTA method, increasing the confidence of unknown identifications. Lastly, this work highlights the chemometric needs to make such a tool robust and usable across a wide range of NTA disciplines and invites others who are working on various models to participate in the development of the ChemSpaceTool. Ultimately, the development of a chemical space mapping tool strives to enable further standardization of NTA by improving method transparency and communication around false detection rates, thus allowing for more direct method comparisons between studies and improved reproducibility. This, in turn, is expected to promote further widespread applications of NTA beyond research-oriented settings.
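One simple way to realize the score described here, assuming the per-step likelihoods are available and approximately independent, is to multiply them. This is an illustrative sketch only, not the proposed ChemSpaceTool; the step names and probabilities are hypothetical:

```python
from math import prod

def detectability_score(step_likelihoods):
    """Combine per-step likelihoods (e.g., extraction, ionization, spectral
    matching) into one detection/identification score, under the simplifying
    assumption that the workflow steps act independently."""
    return prod(step_likelihoods)

# hypothetical per-step likelihoods for one compound in one NTA workflow
steps = {"extraction": 0.9, "ionization": 0.6, "library_match": 0.5}
score = detectability_score(steps.values())
```

A compound scoring near zero on any single step (e.g., poor ionization efficiency) drags the overall score down, mirroring the idea that non-detection may reflect workflow limitations rather than true absence.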


Subject(s)
Retrospective Studies , Reproducibility of Results , Prospective Studies , Mass Spectrometry/methods , Reference Standards
8.
J Expo Sci Environ Epidemiol ; 32(6): 820-832, 2022 11.
Article in English | MEDLINE | ID: mdl-36435938

ABSTRACT

The rapid characterization of risk to humans and ecosystems from exogenous chemicals requires information on both hazard and exposure. The U.S. Environmental Protection Agency's ToxCast program and the interagency Tox21 initiative have screened thousands of chemicals in various high-throughput (HT) assay systems for in vitro bioactivity. EPA's ExpoCast program is developing complementary HT methods for characterizing the human and ecological exposures necessary to interpret HT hazard data in a real-world risk context. These new approach methodologies (NAMs) for exposure include computational and analytical tools for characterizing multiple components of the complex pathways chemicals take from their source to human and ecological receptors. Here, we analyze the landscape of exposure NAMs developed in ExpoCast in the context of various chemical lists of scientific and regulatory interest, including the ToxCast and Tox21 libraries and the Toxic Substances Control Act (TSCA) inventory. We examine the landscape of traditional and exposure NAM data covering chemical use, emission, environmental fate, toxicokinetics, and ultimately external and internal exposure. We consider new chemical descriptors, machine learning models that draw inferences from existing data, high-throughput exposure models, statistical frameworks that integrate multiple model predictions, and non-targeted analytical screening methods that generate new HT monitoring information. We demonstrate that exposure NAMs drastically improve the coverage of the chemical landscape compared to traditional approaches and recommend a set of research activities to further expand the development of HT exposure data for application to risk characterization. Continuing to develop exposure NAMs to fill priority data gaps identified here will improve the availability and defensibility of risk-based metrics for use in chemical prioritization and screening. 
IMPACT: This analysis describes the current state of exposure assessment-based new approach methodologies across varied chemical landscapes and provides recommendations for filling key data gaps.


Subject(s)
Ecosystem , United States , Humans
9.
Environ Int ; 167: 107385, 2022 09.
Article in English | MEDLINE | ID: mdl-35952468

ABSTRACT

BACKGROUND: Environmental health research has recently undergone a dramatic shift, with ongoing technological advancements allowing for broader coverage of exposure and molecular biology signatures. Approaches to integrate such measures are still needed to increase understanding of the relationships between systems-level exposure and biology. OBJECTIVES: We address this gap by evaluating placental tissues to identify novel chemical-biological interactions associated with preeclampsia. This study tests the hypothesis that understudied chemicals are present in the human placenta and associated with preeclampsia-relevant disruptions, including overall case status (preeclamptic vs. normotensive patients) and underlying transcriptomic/epigenomic signatures. METHODS: A non-targeted analysis based on high-resolution mass spectrometry was used to analyze placental tissues from a cohort of 35 patients with preeclamptic (n = 18) and normotensive (n = 17) pregnancies. Molecular feature data were prioritized for confirmation based on association with preeclampsia case status and confidence of chemical identification. All molecular features were evaluated for relationships to mRNA, microRNA, and CpG methylation (i.e., multi-omic) signature alterations involved in preeclampsia. RESULTS: A total of 183 molecular features were identified with significantly differentiated abundance in placental extracts of preeclamptic patients; these features clustered into distinct chemical groupings using unsupervised methods. Of these features, 53 were identified (mapping to 40 distinct chemicals) using chemical standards, fragmentation spectra, and chemical metadata. In general, human metabolites had the largest feature intensities and strongest associations with preeclampsia-relevant multi-omic changes. Exogenous drugs were the second most abundant and had fewer associations with multi-omic changes. Other exogenous chemicals (non-drugs) were least abundant and had the fewest associations with multi-omic changes.
CONCLUSIONS: These global data trends suggest that human metabolites are heavily intertwined with biological processes involved in preeclampsia etiology, while exogenous chemicals may still impact select transcriptomic/epigenomic processes. This study serves as a demonstration of merging systems exposures with systems biology to better understand chemical-disease relationships.


Subject(s)
Pre-Eclampsia , Cohort Studies , Epigenomics , Female , Humans , Placenta/metabolism , Pre-Eclampsia/genetics , Pre-Eclampsia/metabolism , Pregnancy , Transcriptome
10.
Anal Bioanal Chem ; 414(22): 6455-6471, 2022 Sep.
Article in English | MEDLINE | ID: mdl-35796784

ABSTRACT

Non-targeted analysis (NTA) using high-resolution mass spectrometry has enabled the detection and identification of unknown and unexpected compounds of interest in a wide range of sample matrices. Despite these benefits of NTA methods, standardized procedures do not yet exist for assessing performance, limiting stakeholders' abilities to suitably interpret and utilize NTA results. Herein, we first summarize existing performance assessment metrics for targeted analyses to provide context and clarify terminology that may be shared between targeted and NTA methods (e.g., terms such as accuracy, precision, sensitivity, and selectivity). We then discuss promising approaches for assessing NTA method performance, listing strengths and key caveats for each approach, and highlighting areas in need of further development. To structure the discussion, we define three types of NTA study objectives: sample classification, chemical identification, and chemical quantitation. Qualitative study performance (i.e., focusing on sample classification and/or chemical identification) can be assessed using the traditional confusion matrix, with some challenges and limitations. Quantitative study performance can be assessed using estimation procedures developed for targeted methods with consideration for additional sources of uncontrolled experimental error. This article is intended to stimulate discussion and further efforts to develop and improve procedures for assessing NTA method performance. Ultimately, improved performance assessments will enable accurate communication and effective utilization of NTA results by stakeholders.
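The traditional confusion matrix mentioned above yields the shared targeted/NTA vocabulary (sensitivity, selectivity, accuracy) directly. A minimal sketch with hypothetical tallies, e.g. from a spiked-sample evaluation:

```python
def confusion_metrics(tp, fp, fn, tn):
    """Performance metrics from a 2x2 confusion matrix of NTA calls
    (analyte reported present vs. truly present in the sample)."""
    return {
        "sensitivity": tp / (tp + fn),             # true-positive rate
        "specificity": tn / (tn + fp),             # true-negative rate
        "precision": tp / (tp + fp),               # positive predictive value
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }

# hypothetical tallies: 40 spiked analytes detected, 20 missed,
# 10 false detections, 30 correct non-detections
m = confusion_metrics(tp=40, fp=10, fn=20, tn=30)
```

The caveat noted in the article applies here: in NTA the true-negative count is often ill-defined, since the full set of absent chemicals is unbounded, which limits how far specificity and accuracy can be trusted.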


Subject(s)
Mass Spectrometry , Mass Spectrometry/methods
11.
Sci Data ; 9(1): 314, 2022 06 16.
Article in English | MEDLINE | ID: mdl-35710792

ABSTRACT

Direct monitoring of chemical concentrations in different environmental and biological media is critical to understanding the mechanisms by which human and ecological receptors are exposed to exogenous chemicals. Monitoring data provide evidence of chemical occurrence in different media and can be used to inform exposure assessments. Monitoring data also provide the information required for parameterization and evaluation of predictive models based on chemical uses, fate and transport, and release or emission processes. Finally, these data are useful in supporting regulatory chemical assessment and decision-making. There are a wide variety of public monitoring data available from existing government programs, historical efforts, public data repositories, and peer-reviewed literature databases. However, these data are difficult to access and analyze in a coordinated manner. Here, data from 20 individual public monitoring data sources were extracted, curated for chemical and medium, and harmonized into a sustainable, machine-readable data format to support exposure assessments.

12.
Anal Bioanal Chem ; 414(17): 4919-4933, 2022 Jul.
Article in English | MEDLINE | ID: mdl-35699740

ABSTRACT

Non-targeted analysis (NTA) methods are widely used for chemical discovery but seldom employed for quantitation due to a lack of robust methods to estimate chemical concentrations with confidence limits. Herein, we present and evaluate new statistical methods for quantitative NTA (qNTA) using high-resolution mass spectrometry (HRMS) data from EPA's Non-Targeted Analysis Collaborative Trial (ENTACT). Experimental intensities of ENTACT analytes were observed at multiple concentrations using a semi-automated NTA workflow. Chemical concentrations and corresponding confidence limits were first estimated using traditional calibration curves. Two qNTA estimation methods were then implemented using experimental response factor (RF) data (where RF = intensity/concentration). The bounded response factor method used a non-parametric bootstrap procedure to estimate select quantiles of training set RF distributions. Quantile estimates then were applied to test set HRMS intensities to inversely estimate concentrations with confidence limits. The ionization efficiency estimation method restricted the distribution of likely RFs for each analyte using ionization efficiency predictions. Given the intended future use for chemical risk characterization, predicted upper confidence limits (protective values) were compared to known chemical concentrations. Using traditional calibration curves, 95% of upper confidence limits were within ~tenfold of the true concentrations. The error increased to ~60-fold (ESI+) and ~120-fold (ESI-) for the ionization efficiency estimation method and to ~150-fold (ESI+) and ~130-fold (ESI-) for the bounded response factor method. This work demonstrates successful implementation of confidence limit estimation strategies to support qNTA studies and marks a crucial step towards translating NTA data in a risk-based context.
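The bounded response factor method described here can be sketched as follows: bootstrap a low quantile of the training-set RF distribution, then divide the observed intensity by that quantile to obtain a protective upper concentration limit. All numbers are hypothetical, and this simplified procedure omits the paper's full confidence-limit treatment:

```python
import random
import statistics

def bootstrap_quantile(values, q, n_boot=1000, seed=7):
    """Non-parametric bootstrap estimate of the q-th quantile: resample with
    replacement, take the empirical quantile of each resample, and return
    the median across resamples."""
    rng = random.Random(seed)
    estimates = []
    for _ in range(n_boot):
        resample = sorted(rng.choices(values, k=len(values)))
        idx = min(int(q * len(resample)), len(resample) - 1)
        estimates.append(resample[idx])
    return statistics.median(estimates)

# hypothetical training-set response factors (RF = intensity / concentration)
training_rfs = [0.8, 1.1, 1.3, 2.0, 2.4, 3.1, 4.0, 5.5]
rf_low = bootstrap_quantile(training_rfs, 0.025)  # pessimistic (low) RF

# since concentration = intensity / RF, a low RF quantile gives a
# protective (upper-bound) concentration estimate for an unknown analyte
intensity = 100.0
upper_conc_limit = intensity / rf_low
```

Because the RF quantile comes from surrogate chemicals rather than the analyte's own calibration curve, the resulting bound is deliberately conservative, which is the source of the large uncertainty factors reported in the abstract.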


Subject(s)
Uncertainty , Calibration , Mass Spectrometry/methods
13.
Environ Int ; 158: 107011, 2022 01.
Article in English | MEDLINE | ID: mdl-35386928

ABSTRACT

Chemical risk assessments follow a long-standing paradigm that integrates hazard, dose-response, and exposure information to facilitate quantitative risk characterization. Targeted analytical measurement data directly support risk assessment activities, as well as downstream risk management and compliance monitoring efforts. Yet, targeted methods have struggled to keep pace with the demands for data regarding the vast, and growing, number of known chemicals. Many contemporary monitoring studies therefore utilize non-targeted analysis (NTA) methods to screen for known chemicals with limited risk information. Qualitative NTA data have enabled identification of previously unknown compounds and characterization of data-poor compounds in support of hazard identification and exposure assessment efforts. In spite of this, NTA data have seen limited use in risk-based decision making due to uncertainties surrounding their quantitative interpretation. Significant efforts have been made in recent years to bridge this quantitative gap. Based on these advancements, quantitative NTA data, when coupled with other high-throughput data streams and predictive models, are poised to directly support 21st-century risk-based decisions. This article highlights components of the chemical risk assessment process that are influenced by NTA data, surveys the existing literature for approaches to derive quantitative estimates of chemicals from NTA measurements, and presents a conceptual framework for incorporating NTA data into contemporary risk assessment frameworks.


Subject(s)
Risk Management , Risk Assessment/methods
14.
Environ Toxicol Chem ; 41(5): 1117-1130, 2022 05.
Article in English | MEDLINE | ID: mdl-34416028

ABSTRACT

Unknown chemical releases constitute a large portion of the rapid response situations to which the US Environmental Protection Agency is called on to respond. Workflows used to address unknown chemical releases currently involve screening for a large array of known compounds using many different targeted methods. When matches are not found, expert analytical chemistry knowledge is used to propose possible candidates from the available data, which generally includes low-resolution mass spectra and situational clues such as the location of the release, nearby industrial operations, and other field-reported facts. The past decade has witnessed dramatic improvements in capabilities for identifying unknown compounds using high-resolution mass spectrometry (HRMS) and nontargeted analysis (NTA) approaches. Complementary developments in cheminformatics tools have further enabled an increase in NTA throughput and identification confidence. Together with the expanding availability of HRMS instrumentation in monitoring laboratories, these advancements make NTA highly relevant to rapid response scenarios. In this article, we introduce the concept of NTA as it relates to rapid response needs and describe how it can be applied to address unknown chemical releases. We advocate for the consideration of HRMS-based NTA approaches to support future rapid response scenarios. Environ Toxicol Chem 2022;41:1117-1130. Published 2021. This article is a U.S. Government work and is in the public domain in the USA.


Subject(s)
Mass Spectrometry , Mass Spectrometry/methods , United States , United States Environmental Protection Agency
15.
Anal Chem ; 93(49): 16289-16296, 2021 12 14.
Article in English | MEDLINE | ID: mdl-34842413

ABSTRACT

Non-targeted analysis (NTA) encompasses a rapidly evolving set of mass spectrometry techniques aimed at characterizing the chemical composition of complex samples, identifying unknown compounds, and/or classifying samples, without prior knowledge regarding the chemical content of the samples. Recent advances in NTA are the result of improved and more accessible instrumentation for data generation and analysis tools for data evaluation and interpretation. As researchers continue to develop NTA approaches in various scientific fields, there is a growing need to identify, disseminate, and adopt community-wide method reporting guidelines. In 2018, NTA researchers formed the Benchmarking and Publications for Non-Targeted Analysis Working Group (BP4NTA) to address this need. Consisting of participants from around the world and representing fields ranging from environmental science and food chemistry to 'omics and toxicology, BP4NTA provides resources addressing a variety of challenges associated with NTA. Thus far, BP4NTA group members have aimed to establish a consensus on NTA-related terms and concepts and to create consistency in reporting practices by providing resources on a public Web site, including consensus definitions, reference content, and lists of available tools. Moving forward, BP4NTA will provide a setting for NTA researchers to continue discussing emerging challenges and contribute to additional harmonization efforts.


Subject(s)
Benchmarking , Humans
16.
Anal Chem ; 93(41): 13870-13879, 2021 10 19.
Article in English | MEDLINE | ID: mdl-34618419

ABSTRACT

Non-targeted analysis (NTA) workflows using mass spectrometry are gaining popularity in many disciplines, but universally accepted reporting standards are nonexistent. Current guidance addresses limited elements of NTA reporting-most notably, identification confidence-and is insufficient to ensure scientific transparency and reproducibility given the complexity of these methods. This lack of reporting standards hinders researchers' development of thorough study protocols and reviewers' ability to efficiently assess grant and manuscript submissions. To overcome these challenges, we developed the NTA Study Reporting Tool (SRT), an easy-to-use, interdisciplinary framework for comprehensive NTA methods and results reporting. Eleven NTA practitioners reviewed eight published articles covering environmental, food, and health-based exposomic applications with the SRT. Overall, our analysis demonstrated that the SRT provides a valid structure to guide study design and manuscript writing, as well as to evaluate NTA reporting quality. Scores self-assigned by authors fell within the range of peer-reviewer scores, indicating that SRT use for self-evaluation will strengthen reporting practices. The results also highlighted NTA reporting areas that need immediate improvement, such as analytical sequence and quality assurance/quality control information. Although scores intentionally do not correspond to data/results quality, widespread implementation of the SRT could improve study design and standardize reporting practices, ultimately leading to broader use and acceptance of NTA data.


Subject(s)
Research Design , Mass Spectrometry , Reference Standards , Reproducibility of Results
17.
Anal Bioanal Chem ; 413(30): 7495-7508, 2021 Dec.
Article in English | MEDLINE | ID: mdl-34648052

ABSTRACT

With the increasing availability of high-resolution mass spectrometers, suspect screening and non-targeted analysis are becoming popular compound identification tools for environmental researchers. Samples of interest often contain a large (unknown) number of chemicals spanning the detectable mass range of the instrument. In an effort to separate these chemicals prior to injection into the mass spectrometer, a chromatography method is often utilized. There are numerous types of gas and liquid chromatographs that can be coupled to commercially available mass spectrometers. Depending on the type of instrument used for analysis, the researcher is likely to observe a different subset of compounds based on the amenability of those chemicals to the selected experimental techniques and equipment. It would be advantageous if this subset of chemicals could be predicted prior to conducting the experiment, in order to minimize potential false-positive and false-negative identifications. In this work, we utilize experimental datasets to predict the amenability of chemical compounds to detection with liquid chromatography-electrospray ionization-mass spectrometry (LC-ESI-MS). The assembled dataset totals 5517 unique chemicals either explicitly detected or not detected with LC-ESI-MS. The resulting detected/not-detected matrix has been modeled using specific molecular descriptors to predict which chemicals are amenable to LC-ESI-MS, and to which form(s) of ionization. Random forest models, including a measure of the applicability domain of the model for both positive and negative modes of the electrospray ionization source, were successfully developed. The outcome of this work will help to inform future suspect screening and non-targeted analyses of chemicals by better defining the potential LC-ESI-MS detectable chemical landscape of interest.
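The abstract notes that the random forest models include a measure of applicability domain (AD) but does not specify the AD method. The sketch below uses the simplest common choice, a descriptor bounding box, purely for illustration; the descriptor names and values are hypothetical:

```python
def descriptor_bounds(training_descriptors):
    """Per-descriptor (min, max) ranges spanned by the training set."""
    columns = list(zip(*training_descriptors))
    return [(min(col), max(col)) for col in columns]

def in_domain(bounds, descriptors):
    """Bounding-box applicability domain: a query chemical is in-domain only
    if every descriptor lies within the range seen during training."""
    return all(lo <= d <= hi for (lo, hi), d in zip(bounds, descriptors))

# hypothetical (logP, molecular weight) descriptors for three training chemicals
train = [(1.2, 300.0), (3.5, 450.0), (0.4, 180.0)]
bounds = descriptor_bounds(train)
```

A query like `(2.0, 250.0)` falls inside both ranges and would be trusted, while `(5.0, 250.0)` exceeds the training logP range, so an amenability prediction for it would be flagged as an extrapolation.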

18.
Environ Sci Technol ; 55(16): 11375-11387, 2021 08 17.
Article in English | MEDLINE | ID: mdl-34347456

ABSTRACT

Recycled materials are found in many consumer products as part of a circular economy; however, the chemical content of recycled products is generally uncharacterized. A suspect screening analysis using two-dimensional gas chromatography time-of-flight mass spectrometry (GC × GC-TOFMS) was applied to 210 products (154 recycled, 56 virgin) across seven categories. Chemicals in products were tentatively identified using a standard spectral library or confirmed using chemical standards. A total of 918 probable chemical structures were identified (112 of which were confirmed) in recycled materials, versus 587 (110 confirmed) in virgin materials. Identified chemicals were characterized in terms of their functional use and structural class. Recycled paper products and construction materials contained greater numbers of chemicals than virgin products; 733 identified chemicals had greater occurrence in recycled compared to virgin materials. Products made from recycled materials contained greater numbers of fragrances, flame retardants, solvents, biocides, and dyes. The results were clustered to identify groups of chemicals potentially associated with unique chemical sources, and identified chemicals were prioritized for further study using high-throughput hazard and exposure information. While occurrence is not necessarily indicative of risk, these results can be used to inform the expansion of existing models or identify exposure pathways currently neglected in exposure assessments.


Subject(s)
Flame Retardants , Construction Materials , Flame Retardants/analysis , Gas Chromatography-Mass Spectrometry , Recycling
19.
J Expo Sci Environ Epidemiol ; 31(1): 70-81, 2021 02.
Article in English | MEDLINE | ID: mdl-32661335

ABSTRACT

Chemical exposure via dust ingestion is of great interest to researchers and regulators because children are exposed to dust through their daily activities, and as a result, to the many chemicals contained within dust. Our goal was to develop a workflow to identify and rank organic chemicals that could be used as tracers to calculate children's dust ingestion rates. We proposed a set of criteria for a chemical to be considered a promising tracer. The best tracers must (1) be ubiquitous in dust, (2) be unique to dust, (3) be detectable as biomarkers in accessible biological samples, and (4) have available or obtainable absorption, distribution, metabolism, and excretion (ADME) information for biomarker-based exposure reconstruction. To identify compounds meeting these four criteria, we developed a workflow that encompasses non-targeted analysis approaches, literature and database searching, and multimedia modeling. We then implemented an ad hoc grading system and ranked candidate chemicals based on fulfillment of our criteria (using one small, publicly available dataset to show proof of concept). Initially, five chemicals (1,3-diphenylguanidine, leucine, piperine, 6:2/8:2 fluorotelomer phosphate diester, 6:2 fluorotelomer phosphate diester) appeared to satisfy many of our criteria. However, a rigorous manual investigation raised many questions about the applicability of these chemicals as tracers. Based on the results of this initial pilot study, no individual compounds can be unequivocally considered suitable tracers for calculating dust ingestion rates. Future work must therefore consider larger datasets, generated from broader measurement studies and literature searches, as well as refinements to selection criteria, to identify robust and defensible tracer compounds.
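An ad hoc grading system of the kind described can be sketched as a simple score-and-rank over the four criteria. The per-criterion scores below are placeholders for illustration only; they are not the grades assigned in the study.

```python
# Sketch of a four-criterion grading scheme: score each candidate tracer
# 0-1 on ubiquity in dust, uniqueness to dust, biomarker detectability,
# and ADME availability, then rank by total score. The numeric scores
# are invented, not the study's actual grades.

CRITERIA = ("ubiquitous", "unique", "biomarker", "adme")

candidates = {
    "1,3-diphenylguanidine": {"ubiquitous": 0.9, "unique": 0.6, "biomarker": 0.8, "adme": 0.5},
    "piperine":              {"ubiquitous": 0.7, "unique": 0.3, "biomarker": 0.9, "adme": 0.6},
    "leucine":               {"ubiquitous": 0.9, "unique": 0.1, "biomarker": 0.9, "adme": 0.8},
}

def total_score(scores):
    """Unweighted sum across the four criteria."""
    return sum(scores[c] for c in CRITERIA)

ranked = sorted(candidates, key=lambda name: total_score(candidates[name]),
                reverse=True)
print(ranked)
```

An unweighted sum treats all four criteria as equally important; a refined scheme could weight uniqueness more heavily, since the study found non-uniqueness (e.g., leucine's presence in food) to be a major disqualifier.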


Subject(s)
Dust , Environmental Monitoring , Child , Dust/analysis , Eating , Environmental Exposure/analysis , Humans , Organophosphates , Pilot Projects
20.
Metabolites ; 10(6)2020 Jun 23.
Article in English | MEDLINE | ID: mdl-32585902

ABSTRACT

Software applications for high resolution mass spectrometry (HRMS)-based non-targeted analysis (NTA) continue to enhance chemical identification capabilities. Given the variety of available applications, determining the most fit-for-purpose tools and workflows can be difficult. The Critical Assessment of Small Molecule Identification (CASMI) contests were initiated in 2012 to provide a means to evaluate compound identification tools on a standardized set of blinded tandem mass spectrometry (MS/MS) data. Five CASMI contests have resulted in recommendations, publications, and invaluable datasets for practitioners of HRMS-based screening studies. The US Environmental Protection Agency's (EPA) CompTox Chemicals Dashboard is now recognized as a valuable resource for compound identification in NTA studies. However, this application was too new and immature in functionality to participate in the five previous CASMI contests. In this work, we performed compound identification on all five CASMI contest datasets using Dashboard tools and data in order to critically evaluate Dashboard performance relative to that of other applications. CASMI data was accessed via the CASMI webpage and processed for use in our spectral matching and identification workflow. Relative to applications used by former contest participants, our tools, data, and workflow performed well, placing more challenge compounds in the top five of ranked candidates than did the winners of three contest years and tying in a fourth. In addition, we conducted an in-depth review of the CASMI structure sets and made these reviewed sets available via the Dashboard. Our results suggest that Dashboard data and tools would enhance chemical identification capabilities for practitioners of HRMS-based NTA.
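The spectral matching at the heart of a CASMI-style candidate ranking can be reduced to a cosine similarity between binned MS/MS peak lists. The sketch below shows that bare mechanism under invented peak lists and candidate names; the Dashboard's actual workflow involves considerably more (fragment annotation, metadata ranking, etc.).

```python
import math

# Minimal spectral-matching sketch: bin (m/z, intensity) peaks to nominal
# mass, compute cosine similarity between a query spectrum and each
# library spectrum, and rank the library candidates. Peak lists and
# candidate names are invented for illustration.

def binned(spectrum, width=1.0):
    """Collapse (m/z, intensity) peaks into nominal-mass bins."""
    bins = {}
    for mz, intensity in spectrum:
        key = round(mz / width)
        bins[key] = bins.get(key, 0.0) + intensity
    return bins

def cosine(a, b):
    """Cosine similarity between two binned spectra (dicts)."""
    dot = sum(a[k] * b[k] for k in a.keys() & b.keys())
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

query = [(91.05, 100.0), (119.08, 40.0)]
library = {
    "candidate_A": [(91.06, 95.0), (119.09, 50.0)],   # shares both peaks
    "candidate_B": [(77.04, 80.0), (105.07, 60.0)],   # no shared peaks
}

ranked_ids = sorted(library,
                    key=lambda n: cosine(binned(query), binned(library[n])),
                    reverse=True)
print(ranked_ids)  # -> ['candidate_A', 'candidate_B']
```

Unit-width nominal-mass bins discard the resolution advantage of HRMS data; real tools match peaks within a ppm tolerance instead, but the ranking logic is unchanged.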
